Results 1 - 20 of 138
1.
Transplant Proc ; 56(3): 505-510, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38448249

ABSTRACT

BACKGROUND: Postoperative delirium after organ transplantation can lead to increased length of hospital stay and mortality. Because pain is an important risk factor for delirium, perioperative analgesia with intrathecal morphine (ITM) may mitigate postoperative delirium. We evaluated whether ITM reduces the incidence of postoperative delirium in living donor kidney transplant (LDKT) recipients. METHODS: Two hundred ninety-six patients who underwent LDKT between 2014 and 2018 at our hospital were retrospectively analyzed. Recipients who received preoperative ITM (ITM group) were compared with those who did not (control group). The primary outcome was postoperative delirium based on Confusion Assessment Method for the Intensive Care Unit results during the first 4 postoperative days. RESULTS: Delirium occurred in 2.6% (4/154) of the ITM group and 7.0% (10/142) of the control group. Multivariable analysis showed that age (odds ratio [OR]: 1.07; 95% CI: 1.01-1.14; P = .031), recent smoking (OR: 7.87; 95% CI: 1.43-43.31; P = .018), and preoperative psychotropic use (OR: 23.01; 95% CI: 3.22-164.66; P = .002) were risk factors, whereas ITM was protective (OR: 0.23; 95% CI: 0.06-0.89; P = .033). CONCLUSIONS: Preoperative ITM was independently associated with reduced delirium after LDKT. Further studies and the development of regional analgesia for delirium prevention may enhance the postoperative recovery of transplant recipients.


Subject(s)
Analgesics, Opioid , Delirium , Injections, Spinal , Kidney Transplantation , Living Donors , Morphine , Pain, Postoperative , Humans , Kidney Transplantation/adverse effects , Morphine/administration & dosage , Male , Female , Pain, Postoperative/prevention & control , Pain, Postoperative/etiology , Middle Aged , Retrospective Studies , Delirium/prevention & control , Delirium/etiology , Delirium/epidemiology , Analgesics, Opioid/administration & dosage , Adult , Risk Factors , Psychomotor Agitation/prevention & control , Psychomotor Agitation/etiology , Postoperative Complications/prevention & control , Preoperative Care
2.
Transplant Proc ; 56(3): 686-691, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38378341

ABSTRACT

BACKGROUND: Xenotransplantation, particularly when involving pig donors, presents challenges related to the transmission of porcine cytomegalovirus (pCMV) and its potential impact on recipient outcomes. This study aimed to investigate the relationship between pCMV positivity in donors and recipients and the survival time of cynomolgus monkey recipients after xenogeneic kidney transplantation. METHODS: We performed 20 cynomolgus xenotransplants using 18 transgenic pigs. On the day of surgery, donor pig blood was sampled, and DNA was extracted from serum and peripheral blood mononuclear cells. Recipient DNA was extracted by the same protocol from pre-transplantation through post-transplantation. Porcine cytomegalovirus was detected by real-time polymerase chain reaction (real-time PCR) with the ViroReal kit, achieving a sensitivity of 50 copies/reaction. A Ct value of 37.0 was used as the pCMV positivity threshold. RESULTS: Of the 20 cynomolgus recipients, 9 whose donors tested negative for pCMV were also negative, whereas 4 with pCMV-negative donors tested positive. All 5 recipients of pCMV-positive donors tested positive. Detection of donor pCMV correlated with shorter recipient survival. Continuous recipient positivity during observation also correlated with shorter survival, whereas transient detection showed no significant change in survival. Donor pig phenotypes and transplantation protocols did not significantly impact survival. CONCLUSION: The detection of pCMV in both donors and recipients plays a crucial role in xenotransplantation outcomes. These findings suggest the importance of monitoring and managing pCMV in xenotransplantation to enhance long-term outcomes.
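The positivity call described above reduces to a simple rule on the real-time PCR cycle threshold (Ct): a sample that crosses the detection threshold at or below 37.0 cycles is called pCMV-positive. A minimal sketch in Python (the function name and the handling of non-amplifying samples are illustrative assumptions, not part of the published protocol):

```python
def pcmv_status(ct_value, threshold=37.0):
    """Classify a real-time PCR result as pCMV-positive or -negative.

    Lower Ct means more template DNA, so a sample crossing the
    detection threshold at or below `threshold` cycles is positive.
    A sample that never amplifies (no Ct) is treated as negative.
    """
    if ct_value is None:  # no amplification within the run
        return "negative"
    return "positive" if ct_value <= threshold else "negative"
```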


Subject(s)
Cytomegalovirus Infections , Cytomegalovirus , Kidney Transplantation , Macaca fascicularis , Transplantation, Heterologous , Animals , Transplantation, Heterologous/adverse effects , Swine , Cytomegalovirus/genetics , Cytomegalovirus Infections/mortality , Cytomegalovirus Infections/virology , Kidney Transplantation/adverse effects , Graft Survival , Tissue Donors , Animals, Genetically Modified
3.
Transplant Proc ; 56(3): 705-711, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38395660

ABSTRACT

BACKGROUND: Although non-human primates are the animals closest to humans for simulating physiological and metabolic responses, there is a paucity of standardized, reproducible primate hemorrhagic shock models. Herein, we describe a model that is a clinical replica of extreme class IV hemorrhagic shock, with a step-by-step description of the procedure in cynomolgus macaques. METHODS: The physiological changes that occurred during the process were evaluated using hemodynamic parameters, echocardiography, and laboratory values. Five female monkeys were subjected to trauma laparotomy, followed by cannulation of the abdominal aorta to achieve graded hemorrhage. A central line was placed in the right internal jugular vein and subsequently used for laboratory sampling and volume resuscitation. The withdrawal of blood was stopped when a predefined cardiac endpoint of cardiac arrhythmia or bradycardia was reached. The animals were then immediately resuscitated with transfusion. The primary cardiac endpoint was consistently reached in all 5 animals during the fourth hemorrhage, when more than 70% of the estimated total blood volume had been lost. RESULTS: No mortality occurred during the process. Blood pressure, cardiac output measured by echocardiography, and hemoglobin correlated well with increasing loss of circulating volume, whereas pulse pressure variation did not. Echocardiography was also a useful predictor of the need for urgent volume replacement. CONCLUSION: This model offers a safe and reproducible surgical hemorrhage model in non-human primates that simulates clinical practice. It could provide a useful platform for further studies addressing unanswered questions in trauma management.


Subject(s)
Disease Models, Animal , Hemodynamics , Macaca fascicularis , Shock, Hemorrhagic , Animals , Shock, Hemorrhagic/physiopathology , Shock, Hemorrhagic/therapy , Female , Reproducibility of Results , Blood Pressure , Resuscitation/methods , Echocardiography
4.
Biochem Biophys Rep ; 38: 101658, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38362049

ABSTRACT

Islet transplantation is the most effective treatment strategy for type 1 diabetes. Long-term storage at ultralow temperatures can be used to prepare sufficient islets of good quality for transplantation. For freezing islets, dimethyl sulfoxide (DMSO) is a commonly used penetrating cryoprotective agent (CPA). However, the toxicity of DMSO is a major obstacle to cell cryopreservation. Hydroxyethyl starch (HES) has been proposed as an alternative CPA. To investigate the effects of two types of nonpermeating CPA, we compared 4% HES 130 and HES 200 to 10% DMSO in terms of mouse islet yield, viability, and glucose-stimulated insulin secretion (GSIS). After one day of culture, islets were cryopreserved in each solution. After three days of cryopreservation, islet recovery was significantly higher in the HES 130 and HES 200 groups than in the DMSO group. Islet viability in the HES 200 group was also significantly higher than that in the DMSO group on Days 1 and 3. Stimulation indices determined by GSIS were higher in the HES 130 and HES 200 groups than in the DMSO group on Day 3. After three days of cryopreservation, HES 130 and HES 200 both reduced the expression of apoptosis- and necrosis-associated proteins and promoted islet survival. In conclusion, the use of HES as a CPA improved the survival and insulin secretion of cryopreserved islets compared with a conventional CPA.
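The stimulation index reported for GSIS is conventionally the ratio of insulin secreted under high-glucose stimulation to that secreted at basal glucose. A minimal sketch (the function name is illustrative, and the abstract does not specify the assay units, so the values below are arbitrary):

```python
def stimulation_index(basal_insulin, stimulated_insulin):
    """GSIS stimulation index: insulin released at high glucose
    divided by insulin released at basal (low) glucose, both in the
    same (arbitrary) assay units."""
    if basal_insulin <= 0:
        raise ValueError("basal insulin must be positive")
    return stimulated_insulin / basal_insulin
```

An index well above 1 indicates that the islets remain glucose-responsive after thawing.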

5.
Am J Nephrol ; 55(2): 245-254, 2024.
Article in English | MEDLINE | ID: mdl-38198780

ABSTRACT

INTRODUCTION: Serum activin A has been reported to contribute to vascular calcification and kidney fibrosis in chronic kidney disease. We aimed to investigate whether higher serum activin levels were associated with poor allograft outcomes in patients with kidney transplantation (KT). METHODS: A total of 860 KT patients from KNOW-KT (Korean Cohort Study for Outcome in Patients with Kidney Transplantation) were analyzed. We measured serum activin levels before KT and 1 year after KT. The primary outcome was the composite of a ≥50% decline in estimated glomerular filtration rate and graft failure. A multivariable cause-specific hazard model was used to analyze the association of 1-year activin levels with the primary outcome. The secondary outcome was the coronary artery calcification score (CACS) at 5 years after KT. RESULTS: During a median follow-up of 6.7 years, the primary outcome occurred in 109 (12.7%) patients. Serum activin levels at 1 year were significantly lower than pre-KT levels (488.2 ± 247.3 vs. 704.0 ± 349.6). When patients were grouped by the median activin level at 1 year, the high-activin group had a 1.91-fold higher risk (95% CI, 1.25-2.91) of the primary outcome compared with the low-activin group. As a continuous variable, a one-standard-deviation increase in activin level was associated with a 1.36-fold higher risk (95% CI, 1.16-1.60) of the primary outcome. Moreover, high activin levels were significantly associated with a 1.56-fold higher CACS (95% CI, 1.12-2.18). CONCLUSION: Post-transplant activin levels were independently associated with allograft function as well as coronary artery calcification in KT patients.


Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Cohort Studies , Treatment Outcome , Graft Survival , Allografts , Activins , Risk Factors
6.
Transplantation ; 108(5): 1239-1248, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38291579

ABSTRACT

BACKGROUND: Vascular calcification and stiffness contribute to increased cardiovascular morbidity in patients with chronic kidney disease. This study investigated associations between serum osteoprotegerin (OPG) levels and vascular calcification or stiffness to assess cardiovascular and graft outcomes in kidney transplant patients. METHODS: The KoreaN cohort study for Outcome in patients With Kidney Transplantation was a prospective multicenter cohort study. Serum OPG levels were measured at baseline and 3 y after transplantation in 1018 patients. Patients were classified into high and low OPG groups according to median serum OPG levels. The median follow-up duration was 93.5 mo. RESULTS: The mean age was 45.8 ± 11.7 y and 62.9% were men. Patients with high OPG had significantly higher coronary artery calcium scores, abdominal aortic calcification scores, and brachial-ankle pulse wave velocities than those with lower OPG; these parameters remained significant for 5 y after transplantation. The 3-y OPG levels were lower than baseline values (P < 0.001) and were positively correlated (r = 0.42, P < 0.001). Multivariate Cox regression analysis showed that high OPG levels were significantly associated with posttransplant cardiovascular events (P = 0.008) and death-censored graft loss (P = 0.004). Similar findings regarding posttransplant cardiovascular events (P = 0.012) and death-censored graft loss (P = 0.037) were noted in patients with high OPG at the 3-y follow-up. Mediation analyses revealed that coronary artery calcium scores, abdominal aortic calcification scores, and brachial-ankle pulse wave velocities could act as mediators between serum OPG levels and posttransplant cardiovascular events. CONCLUSIONS: Serum OPG concentration is associated with vascular calcification and stiffness and could be a significant risk factor for cardiovascular outcomes and graft loss in patients undergoing kidney transplantation.


Subject(s)
Kidney Transplantation , Osteoprotegerin , Vascular Calcification , Vascular Stiffness , Humans , Kidney Transplantation/adverse effects , Male , Osteoprotegerin/blood , Female , Middle Aged , Vascular Calcification/blood , Vascular Calcification/etiology , Prospective Studies , Adult , Treatment Outcome , Republic of Korea/epidemiology , Risk Factors , Biomarkers/blood , Graft Survival , Ankle Brachial Index , Pulse Wave Analysis , Time Factors , Cardiovascular Diseases/etiology , Cardiovascular Diseases/blood , Cardiovascular Diseases/diagnosis , Graft Rejection/blood , Graft Rejection/etiology
7.
Xenotransplantation ; 31(1): e12838, 2024.
Article in English | MEDLINE | ID: mdl-38112053

ABSTRACT

BACKGROUND: αGal-deficient xenografts are protected from hyperacute rejection during xenotransplantation but are still rejected more rapidly than allografts. Despite studies showing the roles of non-Gal antibodies and αβ T cells in xenograft rejection, the involvement of γδ T cells has received little investigation. METHODS: Six male cynomolgus monkeys were transplanted with porcine vessel xenografts from wild-type (n = 3) or GGTA1-knockout (n = 3) pigs. We measured the proportions and T cell receptor (TCR) repertoires of blood γδ T cells before and after xenotransplantation. Immune cells infiltrating the grafted porcine vessels were visualized at the end of the experiments. RESULTS: Blood γδ T cells expanded and infiltrated the graft vessel adventitia following xenotransplantation of αGal-deficient pig blood vessels. Pre- and post-transplant analysis of the γδ TCR repertoire revealed a shift in δ chain usage after transplantation, with the expansion of several clonotypes of δ1, δ3, or δ7 chains. Furthermore, the differences between pre- and post-transplant δ chain usage were more prominent than those observed for γ chain usage. CONCLUSION: The γδ TCR repertoire was significantly altered by xenotransplantation, suggesting a role for γδ T cells in sustained xenoreactive immune responses.


Subject(s)
Primates , T-Lymphocyte Subsets , Animals , Male , Heterografts , Receptors, Antigen, T-Cell , Swine , Transplantation, Heterologous , Macaca fascicularis
8.
Sci Rep ; 13(1): 22387, 2023 12 16.
Article in English | MEDLINE | ID: mdl-38104210

ABSTRACT

Protocol biopsy is a reliable method for assessing allograft status after kidney transplantation (KT). However, because of the risk of complications, it is necessary to establish indications and perform protocol biopsies selectively by identifying the group at high risk for early subclinical rejection (SCR). The purpose of this study was therefore to analyze the incidence and risk factors of early SCR (within 2 weeks) and to develop a prediction model using machine learning. Patients who underwent KT at Samsung Medical Center from January 2005 to December 2020 were investigated. The incidence of SCR was determined and risk factors were analyzed. For the prediction model, machine learning methods (random forest, elastic net, and extreme gradient boosting [XGB]) and logistic regression were used, and performance was compared across models. A cohort of 987 patients was reviewed and analyzed. The incidence of SCR was 14.6%. Borderline cellular rejection (BCR) was the most common type of rejection, accounting for 61.8% of cases. In the multivariate analysis, recipient age (OR 0.98, p = 0.03), donor BMI (OR 1.07, p = 0.02), ABO incompatibility (OR 0.15, p < 0.001), two HLA class II mismatches (OR 6.44, p < 0.001), and ATG induction (OR 0.41, p < 0.001) were associated with SCR. The logistic regression prediction model (average AUC = 0.717) and the elastic net model (average AUC = 0.712) demonstrated good performance. HLA class II mismatch and induction type were consistently identified as important variables in all models. The odds ratio analysis of the logistic prediction model revealed that HLA class II mismatch (OR 6.77) was a risk factor for SCR, whereas ATG induction (OR 0.37) was a protective factor. Early SCR was associated with HLA class II mismatch and induction agent, and a machine learning prediction model shows potential for predicting SCR.
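The models above were compared by area under the ROC curve (AUC). The AUC equals the probability that a randomly chosen rejection case receives a higher predicted risk than a randomly chosen non-rejection case, so it can be computed directly from pairwise comparisons (the Mann-Whitney U statistic). A self-contained sketch with illustrative data, not the study's models:

```python
def auc_score(labels, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs in which the positive case is scored
    higher, counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one case of each class")
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

An AUC of about 0.72, as reported for the best models here, means the model ranks a true SCR case above a non-SCR case roughly 72% of the time.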


Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Graft Rejection/etiology , Risk Factors , Blood Group Incompatibility , Machine Learning , Retrospective Studies
9.
Article in English | MEDLINE | ID: mdl-37919893

ABSTRACT

Background: Immunosenescence gradually deteriorates the function of the immune system, making elderly patients susceptible to infection while reducing rejection of organ transplants. Therefore, age-adapted immunosuppression is necessary in the elderly. We evaluated clinical outcomes such as rejection and infection rates when basiliximab and rabbit anti-thymocyte globulin (r-ATG) were used as induction agents in elderly and young organ transplant recipients. Methods: We retrospectively reviewed patients who underwent kidney transplantation (KT) between June 2011 and April 2019. We enrolled 704 adult KT patients and classified them into groups according to age. We compared the outcomes of infection and biopsy-proven acute rejection (BPAR) according to the type of induction agent (basiliximab vs. r-ATG [4.5 mg/kg]). Results: The cohort included 520 recipients (74.6%) in the younger group and 179 recipients (25.4%) in the older group. When r-ATG was used as the induction agent, BPAR within 6 months occurred less frequently (p = 0.03); however, infections within 6 months were more frequent in older recipients. Deaths due to infection were more common in older recipients (p = 0.003). Conclusion: It may be necessary to use less intensive induction therapy in older recipients, and dose reduction of r-ATG is one option.

10.
BMC Anesthesiol ; 23(1): 263, 2023 08 05.
Article in English | MEDLINE | ID: mdl-37543574

ABSTRACT

BACKGROUND: International guidelines recommend preemptive kidney transplantation (KT), that is, transplantation before the initiation of dialysis, as the preferred approach. It is advantageous for graft and patient survival because it avoids dialysis-related complications. However, recipients of preemptive KT may undergo anesthesia without the opportunity to optimize volume status or correct the metabolic disturbances associated with end-stage renal disease. In this regard, we aimed to investigate the anesthetic events that occur more frequently during preemptive KT than during nonpreemptive KT. METHODS: This is a single-center retrospective study. Of the 672 patients who underwent living donor KT (LDKT), 388 of the 519 who underwent nonpreemptive KT were matched with all 153 who underwent preemptive KT using propensity scores based on preoperative covariates. The primary outcome was intraoperative hypotension, defined as the area under the threshold (AUT) for a mean arterial blood pressure below 70 mmHg. The secondary outcomes were intraoperative metabolic acidosis estimated by base excess and serum bicarbonate, electrolyte imbalance, the use of inotropes or vasopressors, intraoperative transfusion, immediate graft function evaluated by the nadir creatinine, and re-operation due to bleeding. RESULTS: After propensity score matching, we analyzed 388 and 153 patients in the nonpreemptive and preemptive groups, respectively. Multivariable analysis revealed that the AUT of the preemptive group was significantly greater than that of the nonpreemptive group (mean ± standard deviation, 29.7 ± 61.5 vs. 14.5 ± 37.7; P = 0.007). Metabolic acidosis was more severe in the preemptive group than in the nonpreemptive group. The differences in nadir creatinine and time to nadir creatinine were statistically significant but clinically insignificant.
CONCLUSION: Intraoperative hypotension and metabolic acidosis occurred more frequently in the preemptive group during LDKT. These findings highlight the need for anesthesiologists to be prepared and vigilant in managing these events during surgery.
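The hypotension metric used here, the area under the threshold (AUT), integrates both how far and how long the mean arterial pressure (MAP) dips below 70 mmHg. A sketch of that computation using the trapezoidal rule (the sampling grid and integration scheme are assumptions for illustration; the study's exact definition may differ in detail):

```python
def area_under_threshold(times_min, map_mmhg, threshold=70.0):
    """AUT in mmHg*min: integrate the depth of mean arterial
    pressure (MAP) excursions below `threshold` over time, applying
    the trapezoidal rule to the clipped deficit curve."""
    # Deficit is zero whenever MAP is at or above the threshold.
    deficit = [max(threshold - m, 0.0) for m in map_mmhg]
    aut = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        aut += 0.5 * (deficit[i] + deficit[i - 1]) * dt
    return aut
```

A patient whose MAP never falls below 70 mmHg scores 0; deeper or longer hypotensive episodes raise the score, which is what makes the group means (29.7 vs. 14.5) comparable.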


Subject(s)
Anesthesia , Kidney Failure, Chronic , Kidney Transplantation , Humans , Retrospective Studies , Creatinine , Propensity Score , Graft Survival , Living Donors , Kidney Failure, Chronic/surgery , Anesthesia/adverse effects
11.
Sci Rep ; 13(1): 12914, 2023 08 09.
Article in English | MEDLINE | ID: mdl-37558742

ABSTRACT

The greater graft failure risk of female-to-male liver transplantation (LT) is thought to result from an acute decrease in hepatic estrogen signaling. Our previous research found evidence that female hepatic estrogen signaling decreases after 40 years of age or with macrosteatosis. We therefore hypothesized that the inferiority of female-to-male LT varies with donor age and macrosteatosis. We stratified 780 recipients of grafts from living donors into four subgroups by donor age and macrosteatosis and compared graft failure risk between female-to-male LT and other LTs within each subgroup using Cox models. In recipients of non-macrosteatotic grafts from donors aged ≤ 40 years, graft failure risk was significantly greater in female-to-male LT than in the others (HR 2.03 [1.18-3.49], P = 0.011). Within the subgroup of recipients without hepatocellular carcinoma, the inferiority of female-to-male LT became greater (HR 4.75 [2.02-11.21], P < 0.001). Despite good graft quality, the 1-year graft failure probability was 37.9% (23.1%-57.9%) for female-to-male LT within this subgroup, whereas such an exceptionally high probability was not seen in any other subgroup even with worse graft quality. When the donor was > 40 years old or macrosteatotic, graft failure risk did not differ significantly between female-to-male LT and the others (P > 0.60). These results agreed with the estrogen receptor immunohistochemistry of the donor livers. In conclusion, the inferiority of female-to-male LT was found only when the donor was ≤ 40 years old and non-macrosteatotic. The abrogation of this inferiority when the donor was > 40 years old or macrosteatotic suggests the presence of dominant contributors to post-transplant graft failure other than graft quality/quantity and supports a role for hepatic estrogen signaling mismatch in graft failure after female-to-male LT.


Subject(s)
Liver Transplantation , Male , Humans , Female , Liver Transplantation/adverse effects , Liver Transplantation/methods , Living Donors , Treatment Outcome , Risk Factors , Tissue Donors , Liver/pathology , Graft Survival , Retrospective Studies
12.
Xenotransplantation ; 30(5): e12814, 2023.
Article in English | MEDLINE | ID: mdl-37493436

ABSTRACT

Xenotransplantation using pig livers offers a potential alternative to overcome the worldwide donor shortage or, more importantly, a bridge to allotransplantation. However, it has been challenged by profound thrombocytopenia and fatal coagulopathy in non-human primate models. Here we suggest that a left auxiliary technique can be a useful method to achieve extended survival of the xenograft. Fifteen consecutive liver xenotransplants were carried out in a pig-to-cynomolgus model: the right auxiliary technique in two cases, orthotopic transplantation in eight, and the left auxiliary technique in five. None of the right auxiliary recipients survived surgery, owing to hemorrhage during the complex dissection between the primate's right lobe and the inferior vena cava. Orthotopic recipients survived less than 7 days secondary to profound thrombocytopenia and coagulopathy. Two of the five left auxiliary xenotransplants survived more than 3 weeks without uncontrolled thrombocytopenia or anemia, one of them surviving 34 days, the longest graft survival reported to date. Left auxiliary xenotransplantation is a feasible approach in non-human primate experiments, and the feared risks of thrombocytopenia and coagulopathy can be minimized. This may allow longer evaluation of the xenograft and help better understand the histopathological and immunological changes that occur after liver xenotransplantation.


Subject(s)
Blood Coagulation Disorders , Liver Transplantation , Thrombocytopenia , Animals , Humans , Swine , Transplantation, Heterologous/methods , Liver Transplantation/methods , Graft Rejection , Animals, Genetically Modified , Primates , Liver/surgery , Thrombocytopenia/surgery , Macaca fascicularis
13.
Front Surg ; 10: 1209698, 2023.
Article in English | MEDLINE | ID: mdl-37377670

ABSTRACT

Background: A high rate of locoregional recurrence is one of the major obstacles to successful treatment of retroperitoneal sarcoma (RPS). Although pre-operative radiation therapy (RT) is considered a potential way to reduce local recurrence, concerns about treatment toxicity and the risk of peri-operative complications need to be addressed. Hence, this study investigates the safety of pre-operative RT (preRTx) for RPS. Methods: A cohort of 198 patients with RPS who had undergone both surgery and RT was analyzed for peri-operative complications. Patients were divided into three groups according to the RT scheme: (1) preRTx, (2) post-operative RT without tissue expander, and (3) post-operative RT with tissue expander. Results: PreRTx was overall well tolerated and did not affect the R2 resection rate, operative time, or severe post-operative complications. However, the preRTx group had a higher incidence of post-operative transfusion and admission to the intensive care unit (p = 0.013 and p = 0.036, respectively), and preRTx was an independent risk factor only for post-operative transfusion (p = 0.009) in multivariate analysis. The median radiation dose was highest in the preRTx group, although no significant difference was demonstrated in overall survival or local recurrence rate. Conclusion: This study suggests that preRTx does not add significant post-operative morbidity for patients with RPS. In addition, radiation dose escalation is achievable with pre-operative RT. However, meticulous intra-operative bleeding control is recommended in these patients, and further high-quality trials are warranted to evaluate long-term oncological outcomes.

14.
Sci Rep ; 13(1): 8406, 2023 05 24.
Article in English | MEDLINE | ID: mdl-37225750

ABSTRACT

Supraglottic airway (SGA) devices may have advantages over endotracheal tubes (ETT) regarding laryngospasm, coughing, sore throat, and hemodynamic changes; however, studies on the use of SGA in laparoscopic donor nephrectomy (LDN) are lacking. Here, we aimed to confirm the safety and feasibility of second-generation SGA in LDN and compare it with ETT. Enrolled adult donors (aged > 18 years) who underwent LDN between August 2018 and November 2021 were divided into two groups: ETT vs. SGA. Airway pressure, lung compliance, desaturation, and hypercapnia were recorded during surgery. After propensity score matching for baseline characteristics and surgical duration, 82 and 152 donors were included in the ETT and SGA groups, respectively, and their outcomes were compared. Peak airway pressure was lower in the SGA group than in the ETT group 5 min after pneumoperitoneum. Dynamic lung compliance was higher in the SGA group throughout surgery. There were no cases of intraoperative desaturation, hypercapnia, or postoperative aspiration pneumonitis. Second-generation SGA, a safe alternative to ETT for LDN, reduced airway resistance and increased lung compliance, suggesting its benefits for airway management in kidney donors.


Subject(s)
Hypercapnia , Laparoscopy , Adult , Humans , Tissue Donors , Laparoscopy/adverse effects , Airway Management , Nephrectomy/adverse effects
15.
Transplant Proc ; 55(4): 769-776, 2023 May.
Article in English | MEDLINE | ID: mdl-37062613

ABSTRACT

Subclinical rejection (SCR) is associated with chronic allograft nephropathy. Therefore, early detection and treatment of SCR through a protocol biopsy (PB) may reduce the incidence of pathologic changes. This study evaluates the impact of early detection and treatment of SCR using a routine PB 2 weeks after kidney transplantation (KT) by examining histologic outcomes 1 year later. We reviewed 624 KT recipients at the Samsung Medical Center between August 2012 and December 2018. PB was planned 2 weeks and 1 year after transplantation, and we compared the histologic changes between the two biopsies. After propensity score matching, patients were divided into two groups according to the PB taken 2 weeks post-transplant: the biopsy-proven normal group (n = 256) and the rejection group (n = 96). The rejection group did not differ significantly from the normal group in the trend of graft function or the Kaplan-Meier curve for graft survival. In the histologic outcomes, the pathologic differences between the groups improved significantly between the two time points. Treating SCR detected by a PB 2 weeks after KT can contribute to the maintenance of graft function and improve histologic changes 1 year after KT.


Subject(s)
Glomerulosclerosis, Focal Segmental , Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Graft Rejection/epidemiology , Biopsy , Graft Survival , Glomerulosclerosis, Focal Segmental/pathology , Kidney/pathology
16.
Clin Microbiol Infect ; 29(7): 911-917, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36868356

ABSTRACT

OBJECTIVES: Kidney transplant (KT) recipients have an increased risk of herpes zoster (HZ) and its complications. Although the recombinant zoster vaccine is favoured over zoster vaccine live (ZVL), ZVL is also recommended to prevent HZ in KT candidates. We aimed to evaluate the clinical effectiveness of ZVL in KT recipients immunized before transplantation. METHODS: Adult patients who received a kidney transplant from January 2014 to December 2018 were enrolled. Patients were observed until HZ occurrence, death, loss of allograft, loss to follow-up, or 5 years after transplantation. An inverse probability of treatment-weighted Cox proportional hazards model was used to compare the incidence of HZ after transplantation between vaccinated and unvaccinated patients. RESULTS: A total of 84 vaccinated and 340 unvaccinated patients were included. The median age was higher in the vaccinated group (57 vs. 54 years, p = 0.003). Grafts from deceased donors were transplanted more frequently in the unvaccinated group (16.7% vs. 51.8%, p < 0.001). The 5-year cumulative HZ incidence was 11.9%, which translated to 26.27 (95% CI, 19.33-34.95) per 1000 person-years; the incidence in the vaccinated and unvaccinated groups was 3.9% and 13.7%, respectively. After adjustment, vaccination showed significant protective effectiveness against HZ (adjusted hazard ratio, 0.18; 95% CI, 0.05-0.60). In addition, all four cases of disseminated zoster occurred in the unvaccinated group. DISCUSSION: Our study, the first on the clinical effectiveness of zoster vaccines in KT recipients, suggests that ZVL before transplantation effectively prevents HZ.
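The inverse-probability-of-treatment-weighted (IPTW) Cox model used above reweights each patient by the inverse of the probability of the treatment they actually received, given measured covariates, so that the vaccinated and unvaccinated groups become comparable on those covariates. A minimal sketch of the weight calculation (variable names and propensity values are illustrative; fitting the propensity model itself is omitted):

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights: 1/ps for treated
    (vaccinated) subjects and 1/(1 - ps) for untreated, where ps is
    the estimated propensity of being treated given covariates."""
    return [1.0 / p if t else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]
```

The resulting weights would then be passed to a weighted Cox proportional hazards fit; patients who received an "unlikely" treatment given their covariates are up-weighted so the pseudo-population is balanced.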


Subject(s)
Herpes Zoster Vaccine , Herpes Zoster , Kidney Transplantation , Adult , Humans , Herpes Zoster Vaccine/adverse effects , Cohort Studies , Retrospective Studies , Kidney Transplantation/adverse effects , Herpes Zoster/epidemiology , Herpes Zoster/prevention & control , Herpesvirus 3, Human , Vaccination , Treatment Outcome
17.
Transplant Proc ; 55(4): 756-768, 2023 May.
Article in English | MEDLINE | ID: mdl-36990887

ABSTRACT

Many studies have reported that protocol biopsy (PB) may help preserve kidney function in kidney transplant recipients. Early detection and treatment of subclinical rejection may reduce the incidence of chronic antibody-mediated rejection and graft failure. However, no consensus has been reached regarding PB effectiveness, timing, and policy. This study aimed to evaluate the protective role of routine PB performed 2 weeks and 1 year after kidney transplantation. We reviewed 854 kidney transplant recipients at the Samsung Medical Center between July 2007 and August 2017, with PBs planned at 2 weeks and 1 year after transplantation. We compared the trends in graft function, chronic kidney disease (CKD) progression, new-onset CKD, infection, and patient and graft survival between the 504 patients who underwent PB and 350 who did not undergo PB. The PB group was again divided into 2 groups: the single PB group (n = 207) and the double PB group (n = 297). The PB group was significantly different from the no-PB group in terms of the trends in graft function (estimated glomerular filtration rate). The Kaplan-Meier curve showed that PB did not significantly improve graft or overall patient survival. However, in the multivariate Cox analysis, the double PB group had advantages in graft survival, CKD progression, and new-onset CKD. PB can play a protective role in the maintenance of kidney grafts in kidney transplant recipients.


Subject(s)
Kidney Transplantation , Renal Insufficiency, Chronic , Humans , Kidney Transplantation/methods , Graft Rejection/epidemiology , Kidney , Biopsy , Graft Survival , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/surgery , Renal Insufficiency, Chronic/pathology , Allografts , Retrospective Studies , Review Literature as Topic
18.
Ultrasonography ; 42(2): 238-248, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36935601

ABSTRACT

PURPOSE: This study evaluated the role of donor kidney ultrasonography (US) in predicting functional kidney volume and identifying ideal kidney grafts in deceased donor kidney transplantation. METHODS: In total, 272 patients who underwent deceased donor kidney transplantation from 2000 to 2020 at Samsung Medical Center were enrolled. Donor kidney information (i.e., right or left) was provided to the radiologist who performed the US image re-analysis. To binarize each kidney's ultrasound parameters, an optimal cutoff for predicting an estimated glomerular filtration rate (eGFR) of less than 30 mL/min/1.73 m² within 1 year after kidney transplantation was selected from the receiver operating characteristic curve, subject to a specificity >60%. Regression analyses were performed for an eGFR of less than 30 mL/min/1.73 m² within 1 year after kidney transplantation and for graft failure within 2 years after kidney transplantation. RESULTS: The product of renal length and cortical thickness was a statistically significant predictor of graft function. The odds ratio for an eGFR of less than 30 mL/min/1.73 m² within a year after kidney transplantation and the hazard ratio for graft failure within 2 years after kidney transplantation were 5.91 (P=0.003) and 5.76 (P=0.022), respectively. CONCLUSION: Preoperative US of the donor kidney can be used to evaluate donor kidney function and predict short-term graft survival. An imaging modality such as US should be included in the donor selection criteria as an additional recommendation. However, the purpose of this study was not to narrow the expanded criteria but to avoid catastrophic consequences by identifying ideal donor kidneys using preoperative US.
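The cutoff selection described above, choosing a threshold on an ultrasound parameter from the ROC curve while holding specificity above 60%, can be sketched in pure Python. The data, the assumption that smaller parameter values predict the adverse outcome, and the rule of maximizing sensitivity under the specificity constraint are illustrative assumptions, not the study's exact procedure:

```python
def cutoff_with_min_specificity(values, labels, min_spec=0.60):
    """Pick the threshold on a US parameter (e.g. renal length x cortical
    thickness) that maximizes sensitivity for the adverse outcome
    (label 1 = eGFR < 30) while keeping specificity above min_spec.
    A value at or below the threshold is called positive (small kidney).
    Returns (threshold, sensitivity, specificity) or None.
    """
    best = None
    for thr in sorted(set(values)):
        pred = [v <= thr for v in values]
        tp = sum(p and l for p, l in zip(pred, labels))
        fn = sum((not p) and l for p, l in zip(pred, labels))
        tn = sum((not p) and (not l) for p, l in zip(pred, labels))
        fp = sum(p and (not l) for p, l in zip(pred, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if spec > min_spec and (best is None or sens > best[1]):
            best = (thr, sens, spec)
    return best
```

Each candidate threshold corresponds to one point on the ROC curve; the constraint simply discards the points in the low-specificity region before the best operating point is chosen.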

19.
Front Immunol ; 14: 1139980, 2023.
Article in English | MEDLINE | ID: mdl-36936968

ABSTRACT

Introduction: The effect of tixagevimab/cilgavimab (Evusheld™; AstraZeneca, UK) should be evaluated in the context of concurrent outbreak situations. Methods: For serologic investigation of tixagevimab/cilgavimab during the BA.5 outbreak period, we investigated sera of immunocompromised (IC) hosts sampled before and one month after tixagevimab/cilgavimab administration, and sera of healthcare workers (HCWs) sampled one month after a third shot of COVID-19 vaccine, five months after BA.1/BA.2 breakthrough infection (BI), and one month after BA.5 BI. A semi-quantitative anti-spike protein antibody (Sab) test and a plaque reduction neutralization test (PRNT) against BA.5 were performed. Results: A total of 19 IC hosts (five received tixagevimab/cilgavimab 300 mg and 14 received 600 mg) and 41 HCWs (21 experienced BA.1/BA.2 BI and 20 experienced BA.5 BI) were evaluated. Baseline characteristics did not differ significantly between IC hosts and HCWs except for age and hypertension. Sab increased significantly after tixagevimab/cilgavimab administration (median 130.2 BAU/mL before tixagevimab/cilgavimab, 5,665.8 BAU/mL after 300 mg, and 10,217 BAU/mL after 600 mg; both P < 0.001). Sab one month after the third shot (12,144.2 BAU/mL) or five months after BA.1/BA.2 BI (10,455.8 BAU/mL) was comparable with that after tixagevimab/cilgavimab 600 mg, while Sab one month after BA.5 BI was significantly higher (22,216.0 BAU/mL; P < 0.001). BA.5 PRNT ND50 increased significantly after tixagevimab/cilgavimab administration (median ND50 29.6 before tixagevimab/cilgavimab, 170.8 after 300 mg, and 298.5 after 600 mg; both P < 0.001). The ND50 after tixagevimab/cilgavimab 600 mg was comparable to that five months after BA.1 BI (ND50 200.9), while the ND50 one month after the third shot was significantly lower (ND50 107.6; P = 0.019). The ND50 one month after BA.5 BI (ND50 1,272.5) was the highest among the tested groups, but the difference from tixagevimab/cilgavimab 600 mg was not statistically significant. Conclusion: Tixagevimab/cilgavimab provided neutralizing activity against BA.5 comparable to that of a healthy adult population vaccinated with a third shot or having experienced BA.1/BA.2 BI.
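The PRNT ND50 values quoted above are the reciprocal serum dilutions at which plaque counts fall by 50%. One common way to obtain an ND50 from a dilution series is log-linear interpolation between the two dilutions bracketing 50% neutralization; a minimal sketch, with a hypothetical dilution series rather than study data:

```python
import math

def nd50(dilutions, neutralization):
    """Interpolate the 50% neutralizing dilution (ND50).

    dilutions      -- reciprocal serum dilutions, ascending (e.g. 20, 40, 80)
    neutralization -- fraction of plaques neutralized at each dilution
                      (decreasing as the serum is diluted out)
    """
    pairs = list(zip(dilutions, neutralization))
    for (d1, n1), (d2, n2) in zip(pairs, pairs[1:]):
        if n1 >= 0.5 >= n2:
            # Linear interpolation on the log-dilution scale.
            frac = (n1 - 0.5) / (n1 - n2)
            return math.exp(
                math.log(d1) + frac * (math.log(d2) - math.log(d1)))
    return None  # 50% neutralization never crossed in the tested range

# Hypothetical series: 80% neutralization at 1:20, 20% at 1:80 -> ND50 = 40
titer = nd50([20, 80], [0.8, 0.2])
```

Laboratories also fit sigmoidal (four-parameter logistic) curves to the full series; the interpolation above is the simpler bracketing variant, shown only to make the quantity concrete.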


Subject(s)
Breakthrough Infections , COVID-19 , Adult , Humans , COVID-19 Vaccines
20.
Am J Transplant ; 23(4): 565-572, 2023 04.
Article in English | MEDLINE | ID: mdl-36739177

ABSTRACT

Diminished immune response to coronavirus disease 2019 (COVID-19) vaccines and breakthrough infection (BI) are major concerns for solid organ transplant recipients. Humoral and cellular immune responses of kidney transplant (KT) recipients after a third COVID-19 vaccination were investigated and compared with those of matched health care workers. Anti-severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) spike protein antibody and a SARS-CoV-2-specific interferon-gamma release assay (IGRA) were assessed. A total of 38 KT recipients, including 20 with BI and 18 without infection, were evaluated. In the KT BI group, antibody titers increased significantly after the third vaccination (median 5 to 724 binding antibody units [BAU]/mL; P = 0.002), but IGRA responses were negligible. After BI, antibody titers increased further (median 11,355 BAU/mL; P < 0.001), and there was a significant increase in IGRA responses to spike proteins (Spike1-Nil, median 0.05 to 0.41 IU/mL; P = 0.009). Antibody titers and IGRA responses were significantly higher in the BI group than in the noninfection group after 6 months. Immune responses were stronger in the health care worker cohort than in the KT cohort, but the gap narrowed after BI. In conclusion, KT recipients who experienced BI after 3 COVID-19 vaccinations acquired augmented humoral and cellular immune responses.


Subject(s)
COVID-19 , Kidney Transplantation , Humans , COVID-19 Vaccines , SARS-CoV-2 , COVID-19/prevention & control , Breakthrough Infections , Kidney Transplantation/adverse effects , Immunity, Cellular , Antibodies, Viral , Transplant Recipients , Vaccination , Immunity, Humoral